-
-
-
- We generally feel that any religion that can't stand to have fun
- poked at it is in as sad shape as the corresponding kind of person.
-
- III. What kinds of neopagan are there, and where did they come from?
- Depending on who you talk to and what definitions you use, there
- are between 40,000 and 200,000 neopagans in the U.S.; the true figure
- is probably closer to the latter than the former, and the movement is
- still growing rapidly following a major `population explosion' in the
- late 1970s.
-
- The numerically largest and most influential neopagan group is
- the `Kingdom of Wicca' -- the modern witch covens. Modern witchcraft
- has nothing to do with Hollywood's images of the cackling,
- cauldron-stirring crone (though Wiccans sometimes joke about that one)
- and is actively opposed to the psychopathic Satanism that many
- Christians erroneously think of as `witchcraft'. Your author is an
- initiate Wiccan priest and coven leader of long standing.
-
- Other important subgroups include those seeking to revive Norse,
- Egyptian, Amerind, and various kinds of tribal pantheons other than
- the Greek and Celtic ones that have been incorporated into Wicca.
- These generally started out as Wiccan offshoots or have been so
- heavily influenced by Wiccan ritual technique that their people can
- work comfortably in a Wiccan circle and vice versa.
-
- There are also the various orders of ceremonial magicians, most
- claiming to be the successors to the turn-of-the-century Golden Dawn
- or one of the groups founded by Aleister Crowley during his brilliant
- and notorious occult career. These have their own very elaborate
- ritual tradition, and tend to be more intellectual, more rigid, and
- less nature-oriented. They are sometimes reluctant to describe
- themselves as neopagans.
-
- The Discordians (and, more recently, the Discordian-offshoot
- Church of the Sub-Genius) are few in number but quite influential.
- They are the neopagan movement's sacred clowns, puncturing pretense
- and adding an essential note to the pagan festivals. Many Wiccans,
- especially among priests and priestesses, are also Discordians and
- will look you straight in the eye and tell you that the entire
- neopagan movement is a Discordian hoax...
-
- Neopaganism used to be largely a white, upper-middle-class
- phenomenon, but that has been changing during the last ten years.
- So-called `new-collar' workers have come in droves during the
- eighties. We still see fewer non-whites, proportionately, than there
- are in the general population, but that is also changing (though more
- slowly). With the exception of a few nut-fringe `Aryan' groups
- detested by the whole rest of the movement, neopagans are actively
- anti-racist. Prejudice is not the problem; rather, the ideas
- have tended to be accepted by the more educated segments of society
- first, and until recently those more educated segments were mostly
- white.
-
- On the East Coast, a higher-than-general-population percentage of
- neopagans have Roman Catholic or Jewish backgrounds, but figures
- suggest this is not true nationwide. There is also a very significant
- overlap in population with science-fiction fandom and the Society for
- Creative Anachronism.
-
- Politically, neopagans are distributed about the same as the
- general population, except that whether liberal or conservative they
- tend to be more individualist and less conformist and moralistic than
- average. It is therefore not too surprising that the one significant
- difference in distribution is the presence of a good many more
- libertarians than one would see in a same-sized chunk of the general
- population (I particularly register this because I'm a libertarian
- myself, but non-libertarians have noted the same phenomenon). These
- complexities are obscured by the fact that the most politically active
- and visible neopagans are usually ex-hippie left-liberals from the
- 1960s.
-
- I think the most acute generalization made about pagans as a
- whole is Margot Adler's observation that they are mostly self-made
- people, supreme individualists not necessarily in the assertive or
- egoist sense but because they have felt the need to construct their
- own culture, their own definitions, their own religious paths, out of
- whatever came to hand rather than accepting the ones that the
- mainstream offers.
-
- IV. Where do I find out more?
- I have deliberately not said much about mythology, or specific
- religious practice or aims, or the role of magic and to what extent we
- practice and 'believe' in it. Any one of those is a topic for another
- posting; but you can get a lot of information from books. Here's a
- basic bibliography:
-
-
- Adler, Margot _Drawing_Down_the_Moon_ (Random House 1979, hc)
-
- This book is a lucid and penetrating account of who the modern
- neo-pagans are, what they do and why they do it, from a woman who
- spent almost two years doing observer-participant journalism in the
- neo-pagan community. Especially valuable because it combines an
- anthropologist's objectivity with a candid personal account of her own
- feelings about all she saw and did and how her ideas about the
- neo-pagans changed under the impact of the experiences she went
- through. Recommended strongly as a first book on the subject, and
- it's relatively easy to find. There is now a revised and expanded
- second edition available.
-
-
- Starhawk _The_Spiral_Dance_
-
- An anthology of philosophy, poetry, training exercises, ritual
- outlines and instructive anecdotes from a successful working coven.
- First-rate as an introduction to the practical aspects of magick and
- running a functioning circle. Often findable at feminist bookstores.
-
-
- Shea, Robert and Wilson, Robert Anton _Illuminatus!_ (Dell, 1975, pb)
-
- This work of alleged fiction is an incredible berserko-surrealist
- rollercoaster that _will_ bend your mind into a pretzel with an
- acid-head blitzkrieg of plausible, instructive and enlightening lies
- and a few preposterous and obscure truths. Amidst this eccentric tale
- of world-girdling conspiracies, intelligent dolphins, the fall of
- Atlantis, who _really_ killed JFK, sex, drugs, rock and roll and the
- Cosmic Giggle Factor, you will find Serious Truths about Mind, Time,
- Space, the Nature of God(dess) and What It All Means -- and also learn
- why you should on no account take them Seriously. Pay particular
- attention to Appendix Lamedh ("The Tactics of Magick"), but it won't
- make sense until you've read the rest.
-
- This was first published in 3 volumes as _The_Eye_In_The_Pyramid_,
- _The_ Golden_Apple_ and _Leviathan_, but there's now a one-volume
- trade paperback carried by most chain bookstores under SF.
-
-
- Campbell, Joseph, _The_Masks_of_God_ (Viking Books, 1971, pb)
-
- One of the definitive analytical surveys of world mythography -- and
- readable to boot! It's in 4 volumes:
-
- I. _Primitive_Mythology_
- II. _Oriental_Mythology_
- III. _Occidental_Mythology_
- IV. _Creative_Mythology_
-
- The theoretical framework of these books is a form of pragmatic
- neo-Jungianism which has enormously influenced the neopagans (we can
- accurately be described as the practice for which Campbell and Jung
- were theorizing). Note especially his predictions in vols. I & IV of
- a revival of shamanic, vision-quest-based religious forms. The recent
- Penguin pb edition of this book should be available in the Mythology
- and Folklore section of any large bookstore.
-
-
- Bonewits, Isaac, _Real_Magic_ (Creative Arts Books, 1979, pb)
-
- A fascinating analytical study of the psychodynamics of ritual and
- magick. This was Bonewits's Ph.D. thesis for the world's only known
- doctorate in Magic and Thaumaturgy (UC Berkeley, 1971). Hardest of
- the five to find but well worth the effort -- an enormously
- instructive, trenchant and funny book.
- --
- Eric S. Raymond = esr@snark.thyrsus.com
- From: prechelt@i41s14.ira.uka.de (Lutz Prechelt)
- Newsgroups: comp.ai.neural-nets,news.answers
- Subject: FAQ in comp.ai.neural-nets -- monthly posting
- Supersedes: <nn.posting_720242283@i41s14.ira.uka.de>
- Followup-To: comp.ai.neural-nets
- Date: 28 Nov 1992 03:17:01 GMT
- Organization: University of Karlsruhe, Germany
- Lines: 1609
- Approved: news-answers-request@MIT.Edu
- Expires: 2 Jan 1993 03:18:03 GMT
- Message-ID: <nn.posting_722920683@i41s14.ira.uka.de>
- Reply-To: prechelt@ira.uka.de (Lutz Prechelt)
- NNTP-Posting-Host: i41s18.ira.uka.de
- Keywords: questions, answers, terminology, bibliography
- Originator: prechelt@i41s18
-
- Archive-name: neural-net-faq
- Last-modified: 92/11/13
-
- (FAQ means "Frequently Asked Questions")
-
- ------------------------------------------------------------------------
- Anybody who is willing to contribute any question or
- information, please email me; if it is relevant,
- I will incorporate it. But: Please format your contribution
- appropriately so that I can just drop it in.
-
- The monthly posting is sent out on the 28th of every month.
- ------------------------------------------------------------------------
-
- This is a monthly posting to the Usenet newsgroup comp.ai.neural-nets
- (and news.answers, where it should be findable at ANY time).
- Its purpose is to provide basic information for individuals who are
- new to the field of neural networks or are just beginning to read this
- group. It should help avoid lengthy discussions of questions that
- usually arise for beginners of one kind or the other.
-
- >>>>> SO, PLEASE, SEARCH THIS POSTING FIRST IF YOU HAVE A QUESTION <<<<<
- and
- >>>>> DON'T POST ANSWERS TO FAQs: POINT THE ASKER TO THIS POSTING <<<<<
-
- This posting is archived in the periodic posting archive on
- "pit-manager.mit.edu" [18.172.1.27] (and on some other hosts as well).
- Look in the anonymous ftp directory "/pub/usenet/news.answers",
- the filename is as given in 'Archive-name:' header above.
- If you do not have anonymous ftp access, you can access the archives
- by mail server as well. Send an E-mail message to
- mail-server@pit-manager.mit.edu with "help" and "index" in the body on
- separate lines for more information.
-
-
- The monthly posting is not meant to discuss any topic exhaustively.
-
- Disclaimer: This posting is provided 'as is'.
- No warranty whatsoever is expressed or implied,
- especially, no warranty that the information contained herein
- is correct or useful in any way, although both are intended.
-
- >> To find the answer of question number <x> (if present at all), search
- >> for the string "-A<x>.)" (so the answer to question 12 is at "-A12.)")
-
- And now, in the end, we begin:
-
- ============================== Questions ==============================
-
- (the short forms and non-continuous numbering are intended)
- 1.) What is this newsgroup for ? How shall it be used ?
- 2.) What is a neural network (NN) ?
- 3.) What can you do with a Neural Network and what not ?
- 4.) Who is concerned with Neural Networks ?
-
- 6.) What does 'backprop' mean ?
- 7.) How many learning methods for NNs exist ? Which ?
- 8.) What about Genetic Algorithms ?
-
- 10.) Good introductory literature about Neural Networks ?
- 11.) Any journals and magazines about Neural Networks ?
- 12.) The most important conferences concerned with Neural Networks ?
- 13.) Neural Network Associations ?
- 14.) Other sources of information about NNs ?
-
- 15.) Freely available software packages for NN simulation ?
- 16.) Commercial software packages for NN simulation ?
- 17.) Neural Network hardware ?
-
- 19.) Databases for experimentation with NNs ?
-
- ============================== Answers ==============================
-
- ------------------------------------------------------------------------
-
- -A1.) What is this newsgroup for ?
-
- The newsgroup comp.ai.neural-nets is intended as a forum for people who want
- to use or explore the capabilities of Neural Networks or Neural-Network-like
- structures.
-
- There should be the following types of articles in this newsgroup:
-
- 1. Requests
-
- Requests are articles of the form
- "I am looking for X"
- where X is something public like a book, an article, a piece of software.
-
- If multiple different answers can be expected, the person making the
- request should be prepared to summarize the answers he/she receives
- and should announce the intention to do so with a phrase like
- "Please email, I'll summarize"
- at the end of the posting.
-
- The Subject line of the posting should then be something like
- "Request: X"
-
- 2. Questions
-
- As opposed to requests, questions are concerned with something so specific
- that general interest cannot readily be assumed.
- If the poster thinks that the topic is of some general interest,
- he/she should announce a summary (see above).
-
- The Subject line of the posting should be something like
- "Question: this-and-that"
- or have the form of a question (i.e., end with a question mark)
-
- 3. Answers
-
- These are reactions to questions or requests.
- As a rule of thumb articles of type "answer" should be rare.
- Ideally, in most cases either the answer is too specific to be of general
- interest (and should thus be e-mailed to the poster) or a summary
- was announced with the question or request (and answers should
- thus be e-mailed to the poster).
-
- The subject lines of answers are automatically adjusted by the
- news software.
-
- 4. Summaries
-
- In all cases where the answers to a request or question can be assumed
- to be of some general interest, the poster of the request or question
- shall summarize the answers he/she received.
- Such a summary should be announced in the original posting of the question
- or request with a phrase like
- "Please answer by email, I'll summarize"
-
- In such a case answers should NOT be posted to the newsgroup but instead
- be mailed to the poster who collects and reviews them.
- After about 10 to 20 days from the original posting, its poster should
- make the summary of answers and post it to the net.
-
- Some care should be invested into a summary:
- a) simple concatenation of all the answers is not enough;
- instead redundancies, irrelevancies, verbosities and
- errors must be filtered out (as well as possible),
- b) the answers shall be separated clearly
- c) the contributors of the individual answers shall be identifiable
- (unless they requested to remain anonymous [yes, that happens])
- d) the summary shall start with the "quintessence" of the answers,
- as seen by the original poster
- e) A summary should, when posted, clearly be indicated to be one
- by giving it a Subject line starting with "Summary:"
-
- Note that a good summary is pure gold for the rest of the newsgroup
- community, so summary work will be most appreciated by all of us.
- (Good summaries are more valuable than any moderator ! :-> )
-
- 5. Announcements
-
- Some articles never need any public reaction.
- These are called announcements (for instance for a workshop,
- conference or the availability of some technical report or
- software system).
-
- Announcements should be clearly indicated to be such by giving
- them a subject line of the form
- "Announcement: this-and-that"
-
- 6. Reports
-
- Sometimes people spontaneously want to report something to the
- newsgroup. This might be special experiences with some software,
- results of their own experiments or conceptual work, or especially
- interesting information from somewhere else.
-
- Reports should be clearly indicated to be such by giving
- them a subject line of the form
- "Report: this-and-that"
-
- 7. Discussions
-
- One especially valuable feature of Usenet is of course the possibility
- of discussing a certain topic with hundreds of potential participants.
- All traffic in the newsgroup that cannot be subsumed under one of
- the above categories should belong to a discussion.
-
- If somebody explicitly wants to start a discussion, he/she can do so
- by giving the posting a subject line of the form
- "Start discussion: this-and-that"
- (People who respond to this, please remove the
- "Start discussion: " label from the subject line of your replies)
-
- It is quite difficult to keep a discussion from drifting into chaos,
- but, unfortunately, as many other newsgroups show, there seems
- to be no sure way to avoid this.
- On the other hand, comp.ai.neural-nets has not had many problems
- with this effect in the past, so let's just go and hope... :->
-
- ------------------------------------------------------------------------
-
- -A2.) What is a neural network (NN) ?
-
- [anybody there to write something better?
- buzzwords: artificial vs. natural/biological; units and
- connections; value passing; inputs and outputs; storage in structure
- and weights; only local information; highly parallel operation ]
-
- First of all, when we talk about a neural network, we *should*
- usually say "artificial neural network" (ANN), because that is
- what we mean most of the time. Biological neural networks are much
- more complicated in their elementary structures than the mathematical
- models we use for ANNs.
-
- A vague description is as follows:
-
- An ANN is a network of many very simple processors ("units"), each
- possibly having a (small amount of) local memory. The units are
- connected by unidirectional communication channels ("connections"),
- which carry numeric (as opposed to symbolic) data. The units operate
- only on their local data and on the inputs they receive via the
- connections.
-
- The design motivation is what distinguishes neural networks from other
- mathematical techniques:
-
- A neural network is a processing device, either an algorithm, or actual
- hardware, whose design was motivated by the design and functioning of human
- brains and components thereof.
-
- Most neural networks have some sort of "training" rule
- whereby the weights of connections are adjusted on the basis of
- presented patterns.
- In other words, neural networks "learn" from examples,
- just like children learn to recognize dogs from examples of dogs,
- and exhibit some structural capability for generalization.
-
- Neural networks normally have great potential for parallelism, since
- the computations of the components are independent of each other.
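The vague description above can be made concrete with a short sketch (in Python; the function names and numbers here are illustrative, not taken from any particular simulator): each unit applies a squashing function to the weighted sum of the numeric values arriving on its connections.

```python
# A minimal sketch of the "network of simple processors" idea:
# each unit operates only on its local weights and the values
# arriving on its input connections.
import math

def unit(inputs, weights, bias):
    """One processing unit: weighted sum of inputs, then a sigmoid."""
    s = bias + sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-s))

# Two units forming a layer.  Each depends only on its own weights
# and the shared input vector -- hence the potential for parallelism.
inputs = [0.5, -1.0]
layer = [unit(inputs, [1.0, 2.0], 0.0),   # unit 1
         unit(inputs, [-1.0, 0.5], 0.1)]  # unit 2
print(layer)
```

Note that all the "knowledge" of such a network lives in the weights and the connection structure, not in any symbolic rules.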
-
- ------------------------------------------------------------------------
-
- -A3.) What can you do with a Neural Network and what not ?
-
- [preliminary]
-
- In principle, NNs can compute any computable function, i.e. they can
- do everything a normal digital computer can do.
- In particular, anything that can be represented as a mapping between
- vector spaces can be approximated to arbitrary precision by feedforward
- NNs (which are the most often used type).
-
- In practice, NNs are especially useful for mapping problems
- which are tolerant of a high error rate, have lots of example data
- available, but to which hard and fast rules can not easily be applied.
-
- NNs are especially bad for problems that are concerned with manipulation
- of symbols and for problems that need short-term memory.
-
- ------------------------------------------------------------------------
-
- -A4.) Who is concerned with Neural Networks ?
-
- Neural Networks are interesting for quite a lot of very dissimilar people:
-
- - Computer scientists want to find out about the properties of
- non-symbolic information processing with neural nets and about learning
- systems in general.
- - Engineers of many kinds want to exploit the capabilities of
- neural networks on many areas (e.g. signal processing) to solve
- their application problems.
- - Cognitive scientists view neural networks as a possible apparatus to
- describe models of thinking and consciousness (high-level brain function).
- - Neuro-physiologists use neural networks to describe and explore
- medium-level brain function (e.g. memory, sensory system, motor control).
- - Physicists use neural networks to model phenomena in statistical
- mechanics and for a lot of other tasks.
- - Biologists use Neural Networks to interpret nucleotide sequences.
- - Philosophers and some other people may also be interested in
- Neural Networks for various reasons.
-
- ------------------------------------------------------------------------
-
- -A6.) What does 'backprop' mean ?
-
- [anybody to write something similarly short,
- but easier to understand for a beginner ? ]
-
- It is an abbreviation for 'backpropagation of error', which is the
- most widely used learning method for neural networks today.
- Although it has many disadvantages, which could be summarized in the
- sentence
- "You hardly know what you are actually doing
- when using backpropagation" :-)
- it has had considerable success in practical applications and is
- relatively easy to apply.
-
- It is for the training of layered (i.e., nodes are grouped
- in layers) feedforward (i.e., the arcs joining nodes are
- unidirectional, and there are no cycles) nets.
-
- Back-propagation needs a teacher that knows the correct output for any
- input ("supervised learning") and uses gradient descent on the error
- (as provided by the teacher) to train the weights. The activation
- function is (usually) a sigmoidal (i.e., bounded above and below, but
- differentiable) function of a weighted sum of the node's inputs.
-
- The use of a gradient descent algorithm to train its weights makes it
- slow to train; but being a feedforward algorithm, it is quite rapid during
- the recall phase.
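For the simplest possible case -- a single sigmoid unit rather than a full layered net -- the gradient-descent step that backpropagation repeats layer by layer looks roughly like this (an illustrative Python sketch; the data and learning rate are made up):

```python
# Supervised learning by gradient descent on the squared error,
# for one sigmoid unit (the "delta rule" at the core of backprop).
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# "Teacher" data: the correct output for each input pattern.
patterns = [([0.0, 1.0], 1.0), ([1.0, 0.0], 0.0)]

w = [0.1, -0.1]   # connection weights
rate = 1.0        # learning rate (step size of gradient descent)

def total_error():
    return sum((target - sigmoid(sum(x * wi for x, wi in zip(xs, w)))) ** 2
               for xs, target in patterns)

before = total_error()
for _ in range(100):
    for xs, target in patterns:
        out = sigmoid(sum(x * wi for x, wi in zip(xs, w)))
        # Gradient of the squared error; out*(1-out) is the
        # derivative of the sigmoid at the current activation.
        delta = (target - out) * out * (1.0 - out)
        for i, x in enumerate(xs):
            w[i] += rate * delta * x
after = total_error()
print(before, after)
```

In a multi-layer net, backpropagation computes the same kind of delta for the hidden units by propagating the output-layer errors backwards through the weights; the training loop itself is unchanged.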
-
- Literature:
- Rumelhart, D. E. and McClelland, J. L. (1986):
- Parallel Distributed Processing: Explorations in the
- Microstructure of Cognition (volume 1, pp 318-362).
- The MIT Press.
- (this is the classic one) or one of the dozens of other books
- or articles on backpropagation :->
-
- ------------------------------------------------------------------------
-
- -A7.) How many learning methods for NNs exist ? Which ?
-
- There are very many learning methods for NNs by now. Nobody can know
- exactly how many.
- New ones (at least variations of existing ones) are invented every
- week. Below is a collection of some of the best-known methods;
- it does not claim to be complete.
-
- The main categorization of these methods is the distinction of
- supervised from unsupervised learning:
-
- - In supervised learning, there is a "teacher" who in the learning
- phase "tells" the net how well it performs ("reinforcement learning")
- or what the correct behavior would have been ("fully supervised learning").
-
- - In unsupervised learning the net is autonomous: it just looks at
- the data it is presented with, finds out about some of the
- properties of the data set and learns to reflect these properties
- in its output. Exactly which properties the network can learn to
- recognise depends on the particular network model and learning
- method.
-
- Many of these learning methods are closely connected with a certain
- (class of) network topology.
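The unsupervised case can be illustrated with a toy sketch (in Python; this is a generic example of competitive learning, not one of the specific methods listed below): two units' weights drift toward clusters present in the data, without any teacher labelling the clusters.

```python
# Unsupervised (competitive) learning: prototype weights discover
# cluster structure in the data on their own.
import random

random.seed(1)

# Data with two obvious clusters near 0.0 and near 1.0; the net is
# never "told" this -- it finds the property by itself.
data = [random.gauss(0.0, 0.1) for _ in range(50)] + \
       [random.gauss(1.0, 0.1) for _ in range(50)]
random.shuffle(data)

prototypes = [0.4, 0.6]   # weights of two units, arbitrary start
rate = 0.1
for x in data:
    # The unit whose weight is closest to the input "wins" ...
    winner = min(range(2), key=lambda i: abs(prototypes[i] - x))
    # ... and only the winner moves its weight toward the input.
    prototypes[winner] += rate * (x - prototypes[winner])

print(prototypes)
```

After one pass, the two weights sit near the two cluster centers: the network's output (which unit wins) now reflects a property of the data set that was never given explicitly.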
-
- Now here is the list, just giving some names:
-
- 1. UNSUPERVISED LEARNING (i.e. without a "teacher"):
- 1). Feedback Nets:
- a). Additive Grossberg (AG)
- b). Shunting Grossberg (SG)
- c). Binary Adaptive Resonance Theory (ART1)
- d). Analog Adaptive Resonance Theory (ART2, ART2a)
- e). Discrete Hopfield (DH)
- f). Continuous Hopfield (CH)
- g). Discrete Bidirectional Associative Memory (BAM)
- h). Temporal Associative Memory (TAM)
- i). Adaptive Bidirectional Associative Memory (ABAM)
- j). Kohonen Self-organizing Map (SOM)
- k). Kohonen Topology-preserving Map (TPM)
- 2). Feedforward-only Nets:
- a). Learning Matrix (LM)
- b). Driver-Reinforcement Learning (DR)
- c). Linear Associative Memory (LAM)
- d). Optimal Linear Associative Memory (OLAM)
- e). Sparse Distributed Associative Memory (SDM)
- f). Fuzzy Associative Memory (FAM)
- g). Counterpropagation (CPN)
-
- 2. SUPERVISED LEARNING (i.e. with a "teacher"):
- 1). Feedback Nets:
- a). Brain-State-in-a-Box (BSB)
- b). Fuzzy Cognitive Map (FCM)
- c). Boltzmann Machine (BM)
- d). Mean Field Annealing (MFT)
- e). Recurrent Cascade Correlation (RCC)
- f). Learning Vector Quantization (LVQ)
- 2). Feedforward-only Nets:
- a). Perceptron
- b). Adaline, Madaline
- c). Backpropagation (BP)
- d). Cauchy Machine (CM)
- e). Adaptive Heuristic Critic (AHC)
- f). Time Delay Neural Network (TDNN)
- g). Associative Reward Penalty (ARP)
- h). Avalanche Matched Filter (AMF)
- i). Backpercolation (Perc)
- j). Artmap
- k). Adaptive Logic Network (ALN)
- l). Cascade Correlation (CasCor)
-
- ------------------------------------------------------------------------
-
- -A8.) What about Genetic Algorithms ?
-
- [preliminary]
- [Who will write a better introduction?]
-
- There are a number of definitions of GA (Genetic Algorithm).
- A possible one is
-
- A GA is an optimization program
- that starts with some encoded procedure, (Creation of Life :-> )
- mutates it stochastically, (Get cancer or so :-> )
- and uses a selection process (Darwinism)
- to prefer the mutants with high fitness
- and perhaps a recombination process (Make babies :-> )
- to combine properties of (preferably) the successful mutants.
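Under that definition, a toy GA might look as follows (an illustrative Python sketch; the fitness function and parameters are invented for the example): it evolves bit strings toward the all-ones string.

```python
# A minimal genetic algorithm: creation, mutation, selection and
# recombination, maximizing a toy fitness over 20-bit strings.
import random

random.seed(0)

def fitness(bits):
    # Toy problem ("OneMax"): fitness is the number of 1 bits.
    return sum(bits)

def mutate(bits):
    # Stochastic mutation: flip each bit with small probability.
    return [b ^ (random.random() < 0.05) for b in bits]

def crossover(a, b):
    # Recombination: combine properties of two successful parents.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(50):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                      # selection: keep the fittest
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
best = max(pop, key=fitness)
print(fitness(best))
```

The same loop applies to NNs by letting the bit string encode network weights or topology and letting "fitness" be the trained network's performance.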
-
- Some GA discussion tends to happen in comp.ai.neural-nets.
- Another loosely relevant group is comp.theory.self-org-sys.
- Perhaps it is time for a comp.ai.ga, comp.theory.ga, or maybe comp.ga.
- There is a GA mailing list which you can subscribe to by
- sending a request to GA-List-Request@AIC.NRL.NAVY.MIL
- You can also try anonymous ftp to
- ftp.aic.nrl.navy.mil
- in the /pub/galist directory. There are papers and some software.
-
- For more details see (for example):
-
- "Genetic Algorithms in Search, Optimization, and Machine Learning"
- by David Goldberg (Addison-Wesley 1989, 0-201-15767-5) or
-
- "Handbook of Genetic Algorithms"
- edited by Lawrence Davis (Van Nostrand Reinhold 1991 0-442-00173-8) or
-
- "Classifier Systems and Genetic Algorithms"
- L.B. Booker, D.E. Goldberg and J.H. Holland, Techreport No. 8 (April 87),
- Cognitive Science and Machine Intelligence Laboratory, University of Michigan
- also reprinted in :
- Artificial Intelligence, Volume 40 (1989), pages 185-234
-
- ------------------------------------------------------------------------
-
- -A10.) Good introductory literature about Neural Networks ?
-
- 0.) The best (subjectively, of course -- please don't flame me):
-
- Hecht-Nielsen, R. (1990). Neurocomputing. Addison Wesley.
- Comments: "A good book", "comprises a nice historical overview and a chapter
- about NN hardware. Well structured prose. Makes important concepts clear."
-
- Hertz, J., Krogh, A., and Palmer, R. (1991). Introduction to the Theory of
- Neural Computation. Addison-Wesley: Redwood City, California.
- Comments: "My first impression is that this one is by far the best book on
- the topic. And it's below $30 for the paperback."; "Well written, theoretical
- (but not overwhelming)"; "It provides a good balance of model development,
- computational algorithms, and applications. The mathematical derivations
- are especially well done"; "Nice mathematical analysis on the mechanism of
- different learning algorithms"; "It is NOT for the mathematical beginner.
- If you don't have a good grasp of higher-level math, this book can
- be really tough to get through."
-
-
- 1.) Books for the beginner:
-
- Aleksander, I. and Morton, H. (1990). An Introduction to Neural Computing.
- Chapman and Hall. (ISBN 0-412-37780-2).
- Comments: "This book seems to be intended for the first year of university
- education."
-
- Beale, R. and Jackson, T. (1990). Neural Computing, an Introduction.
- Adam Hilger, IOP Publishing Ltd : Bristol. (ISBN 0-85274-262-2).
- Comments: "It's clearly written. Lots of hints as to how to get the
- adaptive models covered to work (not always well explained in the
- original sources). Consistent mathematical terminology. Covers
- perceptrons, error-backpropagation, Kohonen self-org model, Hopfield
- type models, ART, and associative memories."
-
- Dayhoff, J. E. (1990). Neural Network Architectures: An Introduction.
- Van Nostrand Reinhold: New York.
- Comments: "Like Wasserman's book, Dayhoff's book is also very easy to
- understand".
-
- McClelland, J. L. and Rumelhart, D. E. (1988).
- Explorations in Parallel Distributed Processing: Computational Models of
- Cognition and Perception (software manual). The MIT Press.
- Comments: "Written in a tutorial style, and includes 2 diskettes of NN
- simulation programs that can be compiled on MS-DOS or Unix (and they do
- too !)"; "The programs are pretty reasonable as an introduction to some
- of the things that NNs can do."; "There are *two* editions of this book.
- One comes with disks for the IBM PC, the other comes with disks for the
- Macintosh".
-